# Autoregressive Transformer
## GP MoLFormer Uniq (ibm-research)

GP-MoLFormer is a chemical language model pretrained on 650 million to 1.1 billion molecular SMILES string representations from ZINC and PubChem, focusing on molecular generation tasks.

- Category: Molecular Model
- Tags: Transformers
- License: Apache-2.0
- Downloads: 122 · Likes: 1
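Because the card lists the Transformers library, molecular generation reduces to ordinary causal-LM sampling over SMILES tokens. The sketch below is illustrative only: the repo id `ibm-research/GP-MoLFormer-Uniq`, the `trust_remote_code` flag, and the single-carbon seed prompt are assumptions, not details taken from this listing.

```python
# Illustrative sketch: sampling SMILES strings from GP-MoLFormer.
# Assumed (not confirmed by the listing): repo id, trust_remote_code,
# and seeding generation with a single carbon atom "C".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-research/GP-MoLFormer-Uniq"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("C", return_tensors="pt")  # seed fragment: one carbon atom
outputs = model.generate(
    **inputs,
    do_sample=True,
    top_k=50,
    max_length=128,
    num_return_sequences=4,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```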
## Musicgen Stereo Medium (facebook)

A stereo music generation model released by Meta AI, capable of generating high-quality music from text descriptions.

- Category: Audio Generation
- Tags: Transformers
- Downloads: 303 · Likes: 30
## Codellama 34b Hf (codellama)

Code Llama is a series of large language models for code generation and understanding developed by Meta; the 34B version is the 34-billion-parameter base model.

- Category: Large Language Model
- Tags: Transformers · Other
- Downloads: 11.90k · Likes: 169
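Since the card lists the Transformers library, the 34B base model can be driven with the standard causal-LM classes. A minimal completion sketch follows; the half-precision and `device_map="auto"` settings (which require `accelerate` and a large GPU) are practical assumptions, not requirements stated in the listing.

```python
# Minimal sketch: code completion with the Code Llama 34B base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-34b-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumes a GPU with enough memory for 34B weights
    device_map="auto",          # requires the `accelerate` package
)

prompt = '''def fibonacci(n):
    """Return the n-th Fibonacci number."""
'''
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```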
## Musicgen Large (facebook)

MusicGen is a text-to-music generation model capable of producing high-quality music samples based on text descriptions or audio prompts.

- Category: Audio Generation
- Tags: Transformers
- Downloads: 5,125 · Likes: 448
## Musicgen Small (facebook)

MusicGen is a text-to-music model that generates high-quality music samples based on text descriptions or audio prompts.

- Category: Audio Generation
- Tags: Transformers
- Downloads: 123.91k · Likes: 429
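The three MusicGen checkpoints above (stereo medium, large, small) share the same text-to-music API in Transformers. A minimal sketch with the small checkpoint follows; the prompt text, token budget, and WAV output path are illustrative choices, not values from the listing.

```python
# Minimal sketch: text-to-music generation with MusicGen via Transformers.
import scipy.io.wavfile
from transformers import AutoProcessor, MusicgenForConditionalGeneration

model_id = "facebook/musicgen-small"  # the large and stereo variants expose the same API
processor = AutoProcessor.from_pretrained(model_id)
model = MusicgenForConditionalGeneration.from_pretrained(model_id)

inputs = processor(
    text=["lo-fi hip hop beat with mellow piano"],
    padding=True,
    return_tensors="pt",
)
# Roughly 256 new tokens correspond to about 5 seconds of audio.
audio_values = model.generate(**inputs, max_new_tokens=256)

# Write the generated waveform to disk at the codec's sampling rate.
sampling_rate = model.config.audio_encoder.sampling_rate
scipy.io.wavfile.write(
    "musicgen_out.wav",
    rate=sampling_rate,
    data=audio_values[0, 0].cpu().numpy(),
)
```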
## Jasmine 350M (UBC-NLP)

JASMINE is a series of Arabic GPT models designed for few-shot learning, with parameters ranging from 300 million to 6.7 billion, pretrained on 235GB of text data.

- Category: Large Language Model
- Tags: Transformers
- Downloads: 81 · Likes: 5
## Bloom (bigscience)

BLOOM is a large-scale multilingual generative model supporting 46 natural languages and 13 programming languages, developed by the international research organization BigScience.

- Category: Large Language Model
- Tags: Transformers · Supports Multiple Languages
- License: Openrail
- Downloads: 3,917 · Likes: 4,891
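BLOOM is likewise a standard Transformers causal LM. Because the full 176B-parameter checkpoint is far too large for most single machines, the sketch below uses the smaller `bigscience/bloom-560m` release to illustrate the same multilingual generation API; the prompt and sampling settings are arbitrary examples.

```python
# Minimal sketch: multilingual text generation with a BLOOM checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

# The full "bigscience/bloom" model has ~176B parameters; the 560M variant
# used here exposes the same API on modest hardware.
model_id = "bigscience/bloom-560m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# BLOOM covers 46 natural languages and 13 programming languages,
# so non-English prompts work directly.
prompt = "La capitale de la France est"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```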
# Featured Recommended AI Models